106 research outputs found

    Pharmaceutical standardization and physicochemical characterization of traditional ayurvedic mineral drug red ochre roasted in cow's ghee (Shuddha Gairik)

    Rasashastra is a pharmaceutics branch of Ayurveda describing scientific methods to convert metals and minerals into bio-compatible formulations, used individually or admixed with plant material to enhance bioavailability and efficacy. In the present study, red ochre (Gairik) powder was processed in cow's ghee by the textual method of roasting. The steps of preparation and the changes in properties at each step were documented and validated in triplicate batches to develop a monograph. Ayurvedic and basic tests were performed to determine the properties of Shuddha Gairik. The physical characterization included Scanning Electron Microscopy (SEM), X-ray Diffraction (XRD), Fourier Transform Infrared (FT-IR) spectroscopy, Dynamic Light Scattering (DLS) and Thermo-gravimetric Analysis (TGA). Elemental composition was estimated by titration and gravimetric analysis, while heavy metal limits were assessed using Inductively Coupled Plasma Optical Emission Spectrometry (ICP-OES). The study showed that roasting crude red ochre, which contains kaolinite with a high iron percentage, in cow's ghee led to the formation of red ochre particles with adsorbed fatty acids. The developed monograph will serve as a guideline for the Ayurvedic industry for the precise formulation of Shuddha Gairik, and will help researchers better understand the importance of Ayurvedic methods of pharmaceutical preparation and carry out mechanistic studies in various diseases.

    Influence of warming and atmospheric circulation changes on multidecadal European flood variability

    European flood frequency and intensity change on a multidecadal scale. Floods were more frequent in the 19th (central Europe) and early 20th century (western Europe) than during the mid-20th century and again more frequent since the 1970s. The causes of this variability are not well understood and the relation to climate change is unclear. Palaeoclimate studies from the northern Alps suggest that past flood-rich periods coincided with cold periods. In contrast, some studies suggest that more floods might occur in a future, warming world. Here we address the contribution of atmospheric circulation and of warming to multidecadal flood variability. For this, we use long series of annual peak streamflow, daily weather data, reanalyses, and reconstructions. We show that both changes in atmospheric circulation and moisture content affected multidecadal changes of annual peak streamflow in central and western Europe over the past two centuries. We find that during the 19th and early 20th century, atmospheric circulation changes led to high peak values of moisture flux convergence. The circulation was more conducive to strong and long-lasting precipitation events than in the mid-20th century. These changes are also partly reflected in the seasonal mean circulation and reproduced in atmospheric model simulations, pointing to a possible role of oceanic variability. For the period after 1980, increasing moisture content in a warming atmosphere led to extremely high moisture flux convergence. Thus, the main atmospheric driver of flood variability changed from atmospheric circulation variability to water vapour increase.
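    The study's key diagnostic, moisture flux convergence, can be sketched on a regular latitude-longitude grid with centred differences. The sketch below uses the single-level, planetary-scale form MFC = -d(qu)/dx - d(qv)/dy, omits the spherical metric term and the vertical integral, and runs on invented fields rather than the study's data:

```python
import numpy as np

R_EARTH = 6.371e6  # Earth radius, metres

def moisture_flux_convergence(q, u, v, lat, lon):
    """MFC = -d(qu)/dx - d(qv)/dy on a (lat, lon) grid, centred differences.
    Single level only; the spherical metric term and the vertical (pressure)
    integral used in full diagnostics are omitted for brevity."""
    lat_r, lon_r = np.deg2rad(lat), np.deg2rad(lon)
    # d/dx at fixed latitude divides by R*cos(lat); d/dy divides by R.
    dqu_dx = np.gradient(q * u, lon_r, axis=1) / (R_EARTH * np.cos(lat_r)[:, None])
    dqv_dy = np.gradient(q * v, lat_r, axis=0) / R_EARTH
    return -(dqu_dx + dqv_dy)

# Invented fields: uniform humidity, westerly flow decelerating eastward.
lat = np.linspace(35.0, 55.0, 21)
lon = np.linspace(-10.0, 20.0, 31)
q = np.full((21, 31), 0.008)                      # specific humidity, kg/kg
u = np.tile(np.linspace(10.0, 0.0, 31), (21, 1))  # zonal wind, m/s
v = np.zeros((21, 31))                            # meridional wind, m/s
mfc = moisture_flux_convergence(q, u, v, lat, lon)
print(f"mean MFC: {mfc.mean():.2e} (kg/kg)/s")  # positive: moisture converges
```

    Because the invented zonal flow decelerates eastward while humidity is uniform, the flux convergence is positive everywhere, the situation the abstract associates with strong precipitation events.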

    Multilayered feed forward Artificial Neural Network model to predict the average summer-monsoon rainfall in India

    In the present research, the possibility of predicting average summer-monsoon rainfall over India has been analyzed through Artificial Neural Network models. In formulating the Artificial Neural Network based predictive model, three-layered networks have been constructed with sigmoid non-linearity. The models under study differ in the number of hidden neurons. After a thorough training and test procedure, the neural net with three nodes in the hidden layer is found to be the best predictive model.
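    The winning architecture is small enough to sketch end-to-end: a three-layer feed-forward network with a 3-neuron sigmoid hidden layer, trained here by plain gradient descent. The data below are synthetic stand-ins for the monsoon rainfall series, which is not reproduced in the abstract:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Synthetic stand-in data: 8 predictors per sample, target in (0, 1).
X = rng.random((200, 8))
y = sigmoid(X @ rng.normal(size=(8, 1)))

# Network: 8 inputs -> 3 sigmoid hidden neurons -> 1 sigmoid output.
W1 = rng.normal(scale=0.5, size=(8, 3)); b1 = np.zeros(3)
W2 = rng.normal(scale=0.5, size=(3, 1)); b2 = np.zeros(1)

lr = 0.5
for _ in range(3000):
    h = sigmoid(X @ W1 + b1)           # hidden activations
    out = sigmoid(h @ W2 + b2)         # network prediction
    d2 = (out - y) * out * (1 - out)   # backprop through the output sigmoid
    d1 = (d2 @ W2.T) * h * (1 - h)     # ... and through the hidden layer
    W2 -= lr * h.T @ d2 / len(X); b2 -= lr * d2.mean(axis=0)
    W1 -= lr * X.T @ d1 / len(X); b1 -= lr * d1.mean(axis=0)

pred = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
mse = float(np.mean((pred - y) ** 2))
print(f"training MSE: {mse:.5f}")
```

    In the study the hidden-layer width was the quantity varied across models; the same loop with `size=(8, n)` hidden weights reproduces that comparison.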

    Rossby wave dynamics of the North Pacific extra-tropical response to El Niño: importance of the basic state in coupled GCMs

    The extra-tropical response to El Niño in a "low" horizontal resolution coupled climate model, typical of the Intergovernmental Panel on Climate Change fourth assessment report simulations, is shown to have serious systematic errors. A high resolution configuration of the same model has a much improved response that is similar to observations. The errors in the low resolution model are traced to an incorrect representation of the atmospheric teleconnection mechanism that controls the extra-tropical sea surface temperatures (SSTs) during El Niño. This is due to an unrealistic atmospheric mean state, which changes the propagation characteristics of Rossby waves. These erroneous upper tropospheric circulation anomalies then induce erroneous surface circulation features over the North Pacific. The associated surface wind speed and direction errors create erroneous surface flux and upwelling anomalies which finally lead to the incorrect extra-tropical SST response to El Niño in the low resolution model. This highlights the sensitivity of the climate response to a single link in a chain of complex climatic processes. The correct representation of these processes in the high resolution model indicates the importance of horizontal resolution in resolving such processes.
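    The propagation argument rests on how the mean flow sets where stationary Rossby waves can exist. A minimal barotropic diagnostic is the stationary wavenumber Ks = a·cos(lat)·sqrt(beta/U); the sketch below uses the planetary vorticity gradient only (full diagnostics use the meridional gradient of mean absolute vorticity) and an invented wind profile:

```python
import numpy as np

OMEGA, A_EARTH = 7.292e-5, 6.371e6  # Earth's rotation rate (1/s), radius (m)

def stationary_wavenumber(u, lat_deg):
    """Non-dimensional zonal wavenumber Ks = a*cos(lat)*sqrt(beta/U) at which
    barotropic Rossby waves are stationary. Planetary beta only; the
    relative-vorticity gradient of the mean flow is neglected for brevity.
    Returns NaN where U <= 0 (no stationary propagation in easterlies)."""
    phi = np.deg2rad(lat_deg)
    beta = 2.0 * OMEGA * np.cos(phi) / A_EARTH
    return A_EARTH * np.cos(phi) * np.sqrt(np.where(u > 0, beta / u, np.nan))

lat = np.array([20.0, 35.0, 50.0, 65.0])
u = np.array([25.0, 30.0, 15.0, -3.0])  # invented zonal-mean winds, m/s
ks = stationary_wavenumber(u, lat)
print(ks)  # NaN at 65N: waves cannot be stationary in the easterlies
```

    A biased mean state shifts this profile, changing which wavenumbers can propagate into the North Pacific, which is the kind of teleconnection error the abstract describes.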

    Task-based Parallel Computation of the Density Matrix in Quantum-based Molecular Dynamics using Graph Partitioning

    Quantum-based molecular dynamics (QMD) is a highly accurate and transferable method for material science simulations. However, the time scales and system sizes accessible to QMD are typically limited to picoseconds and a few hundred atoms. These constraints arise due to expensive self-consistent ground-state electronic structure calculations that can often scale cubically with the number of atoms. Linearly scaling methods depend on computing the density matrix P from the Hamiltonian matrix H by exploiting the sparsity in both matrices. The second-order spectral projection (SP2) algorithm is an O(N) algorithm that computes P with a sequence of 40-50 matrix-matrix multiplications. In this paper, we present task-based implementations of a recently developed data-parallel graph-based approach to the SP2 algorithm, G-SP2. We represent the density matrix P as an undirected graph and use graph partitioning techniques to divide the computation into smaller independent tasks. The partitions thus obtained are generally not of equal size and give rise to undesirable load imbalances in standard MPI-based implementations. This load-balancing challenge can be mitigated by dynamically scheduling parallel computations at runtime using task-based programming models. We develop task-based implementations of the data-parallel G-SP2 algorithm using both Intel's Concurrent Collections (CnC) and the Charm++ programming model and evaluate these implementations for future use. Scaling and performance results of our implementations are investigated for representative segments of QMD simulations for solvated protein systems containing more than 10,000 atoms.
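    The SP2 recursion at the heart of G-SP2 can be sketched in dense form: rescale the Hamiltonian so its spectrum lies in [0, 1] (order reversed), then repeatedly apply X → X² or X → 2X − X², choosing the branch that steers the trace toward the electron count. This serial, dense sketch with an invented toy Hamiltonian omits the sparsity and graph partitioning that make the method O(N):

```python
import numpy as np

def sp2_density_matrix(H, n_occ, tol=1e-8, max_iter=100):
    """Density matrix P for symmetric H with n_occ occupied states (dense SP2)."""
    n = H.shape[0]
    eps = np.linalg.eigvalsh(H)
    # Map the spectrum into [0, 1] with occupied (low-energy) states near 1.
    X = (eps[-1] * np.eye(n) - H) / (eps[-1] - eps[0])
    for _ in range(max_iter):
        X2 = X @ X
        # trace(X - X^2) sums same-signed eigenvalue errors: a safe stop test.
        if abs(np.trace(X) - np.trace(X2)) < tol:
            break
        # Take the projection branch whose trace lands closer to n_occ.
        if abs(np.trace(X2) - n_occ) <= abs(2 * np.trace(X) - np.trace(X2) - n_occ):
            X = X2
        else:
            X = 2 * X - X2
    return X

rng = np.random.default_rng(1)
A = rng.normal(size=(10, 10))
H = (A + A.T) / 2          # invented toy "Hamiltonian"
P = sp2_density_matrix(H, n_occ=4)
print("trace(P) =", round(float(np.trace(P)), 6))
print("idempotency error:", float(np.linalg.norm(P @ P - P)))
```

    Each iteration is one matrix-matrix multiplication; distributing those multiplications over graph partitions of P is exactly where the paper's task-based load balancing enters.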

    An evaluation of the performance of the twentieth century reanalysis version 3

    The performance of a new historical reanalysis, the NOAA–CIRES–DOE Twentieth Century Reanalysis version 3 (20CRv3), is evaluated via comparisons with other reanalyses and independent observations. This dataset provides global, 3-hourly estimates of the atmosphere from 1806 to 2015 by assimilating only surface pressure observations and prescribing sea surface temperature, sea ice concentration, and radiative forcings. Comparisons with independent observations, other reanalyses, and satellite products suggest that 20CRv3 can reliably produce atmospheric estimates on scales ranging from weather events to long-term climatic trends. Not only does 20CRv3 recreate a "best estimate" of the weather, including extreme events, it also provides an estimate of its confidence through the use of an ensemble. Surface pressure statistics suggest that these confidence estimates are reliable. Comparisons with independent upper-air observations in the Northern Hemisphere demonstrate that 20CRv3 has skill throughout the twentieth century. Upper-air fields from 20CRv3 in the late twentieth century and early twenty-first century correlate well with full-input reanalyses, and the correlation is predicted by the confidence fields from 20CRv3. The skill of analyzed 500-hPa geopotential heights from 20CRv3 for 1979–2015 is comparable to that of modern operational 3–4-day forecasts. Finally, 20CRv3 performs well on climate time scales. Long time series and multidecadal averages of mass, circulation, and precipitation fields agree well with modern reanalyses and station- and satellite-based products. 20CRv3 is also able to capture trends in tropospheric-layer temperatures that correlate well with independent products in the twentieth century, placing recent trends in a longer historical context. The research work of R. Przybylak and P. Wyszynski was supported by the National Science Centre, Poland (Grants DEC-2012/07/B/ST10/04002 and 2015/19/B/ST10/02933).

    Stochastic climate theory and modeling

    Stochastic methods are a crucial area in contemporary climate research and are increasingly being used in comprehensive weather and climate prediction models as well as reduced order climate models. Stochastic methods are used as subgrid-scale parameterizations (SSPs) as well as for model error representation, uncertainty quantification, data assimilation, and ensemble prediction. The need to use stochastic approaches in weather and climate models arises because we still cannot resolve all necessary processes and scales in comprehensive numerical weather and climate prediction models. In many practical applications one is mainly interested in the largest and potentially predictable scales and not necessarily in the small and fast scales. For instance, reduced order models can simulate and predict large-scale modes. Statistical mechanics and dynamical systems theory suggest that in reduced order models the impact of unresolved degrees of freedom can be represented by suitable combinations of deterministic and stochastic components and non-Markovian (memory) terms. Stochastic approaches in numerical weather and climate prediction models also lead to the reduction of model biases. Hence, there is a clear need for systematic stochastic approaches in weather and climate modeling. In this review, we present evidence for stochastic effects in laboratory experiments. Then we provide an overview of stochastic climate theory from an applied mathematics perspective. We also survey the current use of stochastic methods in comprehensive weather and climate prediction models and show that stochastic parameterizations have the potential to remedy many of the current biases in these comprehensive models.
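    As a toy illustration of the reduced-order idea, the unresolved fast scales driving a single slow variable can be replaced by an Ornstein-Uhlenbeck (red-noise) term, in the spirit of Hasselmann-type stochastic climate models. All parameters below are illustrative, not taken from the review:

```python
import numpy as np

rng = np.random.default_rng(42)

dt, n_steps = 0.01, 100_000
lam = 0.5                # damping of the resolved slow variable
theta, sigma = 5.0, 2.0  # rate and amplitude of the unresolved OU forcing

x = np.empty(n_steps)
x[0], eta = 0.0, 0.0     # slow variable and its red-noise surrogate forcing
for i in range(1, n_steps):
    # Euler-Maruyama: step the OU surrogate, then the slow variable.
    eta += -theta * eta * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    x[i] = x[i - 1] + (-lam * x[i - 1] + eta) * dt

print(f"slow-variable standard deviation: {x.std():.3f}")
```

    For this linear pair the stationary variance of x is sigma²/(2·theta·lam·(lam+theta)) ≈ 0.145, i.e. a standard deviation near 0.38, which the simulated estimate should approach for long runs.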

    On northern-hemisphere wave patterns associated with winter rainfall events in China

    During extended winter (November-April) 43% of the intraseasonal rainfall variability in China is explained by three spatial patterns of temporally coherent rainfall. These patterns were identified with Empirical Orthogonal Teleconnection (EOT) analysis of observed 1982-2007 pentad rainfall anomalies and connected to midlatitude disturbances. However, examination of individual strong EOT events shows that there is substantial inter-event variability in their dynamical evolution, which implies that precursor patterns found in regressions cannot serve as useful predictors. To understand the physical nature and origins of the extratropical precursors, the EOT technique is applied to six simulations of the Met Office Unified Model at horizontal resolutions of 200 km to 40 km and with and without air-sea coupling. All simulations reproduce the observed precursor patterns in regressions, indicating robust underlying dynamical processes. Further investigation into the dynamics associated with observed patterns shows that Rossby wave dynamics can explain the large inter-event variability. The results suggest that the apparently slowly evolving or quasi-stationary waves in regression analysis are a statistical amalgamation of more rapidly propagating waves with a variety of origins and properties.
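    The EOT procedure itself is simple to sketch: select the point whose time series explains the most variance summed over all points, regress that series out of every point, and repeat on the residuals. The version below runs on a random stand-in field, not the pentad rainfall data:

```python
import numpy as np

def eot(field, n_modes=3):
    """Empirical Orthogonal Teleconnections for a (time, points) anomaly
    matrix: returns base-point indices and base time series per mode."""
    resid = field - field.mean(axis=0)
    bases, series = [], []
    for _ in range(n_modes):
        cov = resid.T @ resid / len(resid)
        var = np.diag(cov)
        # Variance over all points explained by regressing on point j:
        # sum_i cov(i, j)^2 / var(j).
        explained = (cov ** 2).sum(axis=0) / np.where(var > 0, var, np.inf)
        j = int(np.argmax(explained))
        t = resid[:, j].copy()
        bases.append(j)
        series.append(t)
        # Regress the base series out of every point before the next mode.
        beta = resid.T @ t / (t @ t)
        resid = resid - np.outer(t, beta)
    return bases, np.array(series)

rng = np.random.default_rng(3)
field = rng.standard_normal((120, 50))  # stand-in for pentad rainfall anomalies
bases, series = eot(field, n_modes=3)
print("EOT base points:", bases)
```

    Successive base series are mutually orthogonal by construction, which is how the method separates temporally coherent rainfall patterns.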

    Employing Relative Entropy Techniques for Assessing Modifications in Animal Behavior

    In order to make quantitative statements regarding behavior patterns in animals, it is important to establish whether new observations are statistically consistent with the animal's equilibrium behavior. For example, traumatic stress from the presence of a telemetry transmitter may modify the baseline behavior of an animal, which in turn can lead to a bias in results. From the perspective of information theory such a bias can be interpreted as the amount of information gained from a new measurement, relative to an existing equilibrium distribution. One important concept in information theory is the relative entropy, from which we develop a framework for quantifying time-dependent differences between new observations and equilibrium. We demonstrate the utility of the relative entropy by analyzing observed speed distributions of Pacific bluefin tuna, recorded within a 48-hour time span after capture and release. When the observed and equilibrium distributions are Gaussian, we show that the tuna's behavior is modified by traumatic stress, and that the resulting modification is dominated by the difference in central tendencies of the two distributions. Within a 95% confidence level, we find that the tuna's behavior is significantly altered for approximately 5 hours after release. Our analysis reveals a periodic fluctuation in speed corresponding to the moment just before sunrise on each day, a phenomenon related to the tuna's daily diving pattern that occurs in response to changes in ambient light.
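    The Gaussian case the abstract analyses has a closed form that separates exactly into a central-tendency (mean-shift) term and a dispersion term. The sketch below evaluates it with made-up speed statistics, not the tuna data:

```python
import numpy as np

def gaussian_kl(mu_p, sig_p, mu_q, sig_q):
    """Relative entropy D(p || q) in nats between univariate Gaussians,
    split into mean-shift and dispersion contributions."""
    mean_term = (mu_p - mu_q) ** 2 / (2.0 * sig_q ** 2)
    var_term = np.log(sig_q / sig_p) + sig_p ** 2 / (2.0 * sig_q ** 2) - 0.5
    return mean_term + var_term, mean_term, var_term

# Invented numbers: post-release speeds (p) vs. equilibrium speeds (q), m/s.
total, mean_term, var_term = gaussian_kl(mu_p=1.8, sig_p=0.5, mu_q=1.0, sig_q=0.6)
print(f"D(obs || eq) = {total:.3f} nats")
print(f"  mean-shift term: {mean_term:.3f}, dispersion term: {var_term:.3f}")
```

    With these invented numbers the mean-shift term dominates, mirroring the paper's finding that the stress-induced modification is dominated by the difference in central tendencies.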
